Results 1 - 20 of 28
1.
J Surg Res ; 288: 290-297, 2023 08.
Article in English | MEDLINE | ID: mdl-37058985

ABSTRACT

INTRODUCTION: There are many barriers to the implementation of an enhanced recovery after surgery (ERAS) pathway. The aim of this study was to compare surgeon and anesthesiologist perceptions with current practices prior to the initiation of an ERAS protocol in pediatric colorectal patients and to use that information to inform ERAS implementation. METHODS: This was a mixed-methods, single-institution study of barriers to implementation of an ERAS pathway at a free-standing children's hospital. Anesthesiologists and surgeons were surveyed regarding current practices of ERAS components. A retrospective chart review was performed of 5- to 18-y-old patients undergoing colorectal procedures between 2013 and 2017, followed by the initiation of an ERAS pathway, and a prospective chart review for 18 mo postimplementation. RESULTS: The response rate was 100% (n = 7) for surgeons and 60% (n = 9) for anesthesiologists. Preoperative nonopioid analgesics and regional anesthesia were rarely used. Intraoperatively, 54.7% of patients had a fluid balance of <10 cc/kg/h, and normothermia was achieved in only 38.7%. Mechanical bowel prep was frequently utilized (48%). Median nil per os time was significantly longer than required, at 12 h. Postoperatively, 42.9% of surgeons reported that patients could have clear liquids on postoperative day zero, 28.6% on postoperative day one, and 28.6% after flatus. In reality, 53.3% of patients were started on clear liquids after flatus, with a median time of 2 d. Most surgeons (85.7%) expected patients to get out of bed once awake from anesthesia; however, the median time to patients being out of bed was postoperative day one. While most surgeons reported frequent use of acetaminophen and/or ketorolac, only 69.3% of patients received any nonopioid analgesic postoperatively, and only 41.3% received two or more nonopioid analgesics. Nonopioid analgesia showed the greatest improvement from the retrospective to the prospective period: preoperative use of analgesics increased from 5.3% to 41.2% (P < 0.0001), postoperative use of acetaminophen increased by 27.4% (P = 0.5), ketorolac (Toradol) by 45.5% (P = 0.11), and gabapentin by 86.7% (P < 0.0001). Postoperative nausea/vomiting prophylaxis with >1 class of antiemetic increased from 8% to 47.1% (P < 0.001). Length of stay was unchanged (5.7 versus 4.4 d, P = 0.14). CONCLUSIONS: For the successful implementation of an ERAS protocol, perceptions versus reality must be assessed to determine current practices and identify barriers to implementation.
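The pre/post comparisons reported above (for example, preoperative analgesic use rising from 5.3% to 41.2%, P < 0.0001) are the kind of two-proportion contrasts typically tested with a chi-square test. A minimal sketch of that test, using hypothetical counts rather than the study's raw data:

```python
# Illustrative two-proportion comparison of the kind reported above.
# The counts are hypothetical placeholders, not the study's data.
from scipy.stats import chi2_contingency

pre_used, pre_total = 4, 75      # hypothetical retrospective cohort
post_used, post_total = 14, 34   # hypothetical prospective cohort

table = [
    [pre_used, pre_total - pre_used],
    [post_used, post_total - post_used],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"pre: {pre_used / pre_total:.1%}, post: {post_used / post_total:.1%}, p = {p:.4f}")
```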


Subject(s)
Analgesics, Non-Narcotic; Colorectal Neoplasms; Enhanced Recovery After Surgery; Humans; Child; Analgesics, Non-Narcotic/therapeutic use; Acetaminophen; Retrospective Studies; Prospective Studies; Flatulence/drug therapy; Pain, Postoperative/drug therapy; Colorectal Neoplasms/drug therapy; Length of Stay
2.
J Diet Suppl ; 20(1): 118-131, 2023.
Article in English | MEDLINE | ID: mdl-34219586

ABSTRACT

The warm-season, essential oil-producing grass species lemongrass (Cymbopogon citratus), palmarosa grass (C. martini), geranium grass (C. schoenanthus), vetiver grass (Chrysopogon zizanioides), and scented top grass (Capillipedium parviflorum) are used worldwide for their cosmetic and health properties. A discussion providing evidence from literature reviews about the potential antimicrobial and other health uses of these grass species is presented. These species could serve as new therapies for treating microbial infections. The purpose of this study is to discuss in detail the evidence from literature reviews supporting potential health uses and to provide some discussion of agronomic traits for these essential oil-producing species.


Subject(s)
Anti-Infective Agents; Chrysopogon; Oils, Volatile; Biodegradation, Environmental; Anti-Infective Agents/pharmacology
3.
J Diet Suppl ; 20(3): 475-484, 2023.
Article in English | MEDLINE | ID: mdl-34996311

ABSTRACT

Butterfly pea (Clitoria ternatea L.) is a legume used as a tea, forage, ornamental, salad, and medicinal plant. The flowers range from white to dark purple, with little known about the variation in seed and flower color within the United States Department of Agriculture, Agricultural Research Service, Plant Genetic Resources Conservation Unit germplasm collection. Therefore, 26 butterfly pea accessions were analyzed using principal component analysis (PCA) and average linkage cluster analysis (ALCA). These butterfly pea genotypes ranged from 56% to 99% in viability, 2.57 to 5.88 g in 100-seed weight, 34.07 to 226.26 g in total seed weight, and 1,326 to 3,874 in total seed number. PCA accounted for 40%, 57%, 70%, 79%, and 86% of the cumulative variation using principal components (PCs) 1 through 5, respectively. PC1 was most strongly correlated with 100-seed and total seed weight, while PC2 correlated with blue, white, and purple flowers. PC3 correlated mostly with germination, purple flowers, and total seed weight. PCs 4 and 5 primarily correlated with blue and purple flowers, respectively. Several significant correlations were also observed. ALCA grouped the 26 butterfly pea genotypes into four distinct seed number-producing clusters, with clusters 1 to 4 representing the lowest to highest seed numbers produced. Several potential health benefits of butterfly pea flowers, leaves, seeds, and roots for human use were identified from the literature.
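For readers unfamiliar with the two methods named here, a minimal sketch of a PCA followed by average-linkage clustering is shown below; the trait table, column names, and four-cluster cut are hypothetical stand-ins, not the study's data.

```python
# Minimal sketch of the analysis style described: PCA plus average-linkage
# clustering on accession-level trait data. All values are simulated.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
traits = pd.DataFrame(
    rng.normal(size=(26, 5)),
    columns=["germination", "seed_wt_100", "seed_wt_total", "seed_number", "flower_color_score"],
)

X = StandardScaler().fit_transform(traits)          # standardize the traits
pca = PCA(n_components=5).fit(X)
print("cumulative variance explained:", pca.explained_variance_ratio_.cumsum().round(2))

Z = linkage(X, method="average")                    # average linkage cluster analysis
clusters = fcluster(Z, t=4, criterion="maxclust")   # cut the tree into 4 clusters
print("cluster assignments:", clusters)
```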


Subject(s)
Clitoria; United States; Humans; Pisum sativum/genetics; Plant Extracts; Dietary Supplements; Multivariate Analysis
4.
J Diet Suppl ; 20(5): 673-688, 2023.
Article in English | MEDLINE | ID: mdl-35615864

ABSTRACT

Blackeye peas (Vigna unguiculata L. Walp.) are mainly used as a vegetable throughout the world; however, they may contain significant concentrations of quercetin, myricetin, cyanidin, and delphinidin for potential use as a functional vegetable. Thirty-eight blackeye pea genotypes were selected from the core collection in the USDA, ARS, Plant Genetic Resources Conservation Unit's cold storage at 4 °C during 2016. Information regarding concentrations of quercetin, myricetin, cyanidin, and delphinidin, and correlations among these and additional seed traits (seed coat color, seed pattern color, seed pattern, seed texture, and years in storage), would add value to the blackeye pea genotypes for use as a functional vegetable. Using high performance liquid chromatography (HPLC), the red-seeded accession originating from Mozambique, PI 367927, produced the highest quercetin (469.53 µg/g) and myricetin (212.23 µg/g) concentrations. The black-seeded genotype PI 353236, originating from India, produced the highest cyanidin (1,388.82 µg/g) concentration, while PI 353236 and the brown-seeded genotype PI 353352, also from India, produced the highest delphinidin concentrations (1,343.27 and 1,353.94 µg/g, respectively). Several correlations were observed; interestingly, only delphinidin showed a significant negative correlation (r = -0.293*) with years in cold storage, indicating that delphinidin declined in the seeds stored the longest (4-45 years) at 4 °C. Principal component analysis (PCA) explained how the flavonols, anthocyanidins, and additional seed traits contributed to the variation among the blackeye pea genotypes. Cluster analysis yielded six clusters representing low to high phytochemical concentrations. The genetic parameters, including σ²g, σ²p, GCV, PCV, h², and GG, indicate that improvement in these phytochemical traits is possible through selection. The genotypic and phenotypic correlations showed that improving one phytochemical significantly improved the others, except for cyanidin with delphinidin. These results can be used by scientists to develop blackeye pea cultivars with high flavonol and anthocyanidin concentrations.
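As a reference for the genetic parameters listed (GCV, PCV, h², GG), the sketch below applies the standard textbook formulas to hypothetical variance components; the numbers are illustrative only, and the abbreviations are interpreted in their usual quantitative-genetics sense rather than taken from the paper.

```python
# Hedged sketch of the standard genetic-parameter formulas the abstract names.
# Variance components here are hypothetical; in practice they come from an
# ANOVA of replicated genotype trials.
import math

sigma2_g = 1200.0      # genotypic variance (hypothetical)
sigma2_e = 400.0       # error/environmental variance (hypothetical)
sigma2_p = sigma2_g + sigma2_e
trait_mean = 250.0     # trait mean, e.g. a flavonol in ug/g (hypothetical)
k = 2.06               # selection differential at 5% selection intensity

gcv = 100 * math.sqrt(sigma2_g) / trait_mean           # genotypic coefficient of variation
pcv = 100 * math.sqrt(sigma2_p) / trait_mean           # phenotypic coefficient of variation
h2 = sigma2_g / sigma2_p                               # broad-sense heritability
gg = k * h2 * math.sqrt(sigma2_p) / trait_mean * 100   # genetic gain, % of mean

print(f"GCV={gcv:.1f}%  PCV={pcv:.1f}%  h2={h2:.2f}  GG={gg:.1f}%")
```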


Subject(s)
Anthocyanins; Vigna; Pisum sativum/genetics; Vegetables; Quercetin; Flavonols/analysis; Genotype; Phytochemicals; Genetic Variation
5.
Preprint in English | medRxiv | ID: ppmedrxiv-22281024

ABSTRACT

Age is a major risk factor for hospitalization and death after SARS-CoV-2 infection, even in vaccinees. Suboptimal responses to a primary vaccination course have been reported in the elderly, but there is little information regarding the impact of age on responses to booster third doses. Here we show that individuals aged 70 or older who received a primary two-dose schedule with AZD1222 and a booster third dose with an mRNA vaccine achieved significantly lower neutralizing antibody responses against SARS-CoV-2 spike pseudotyped virus than those younger than 70. One month after the booster, neither the concentration of serum binding anti-spike IgG antibody nor the frequency of spike-specific B cells differed by age group. However, the impaired neutralization potency and breadth post-third dose in the elderly were associated with enrichment of circulating "atypical" spike-specific B cells expressing CD11c and FCRL5. Single-cell RNA sequencing confirmed an expansion of TBX21- and ITGAX-expressing B cells in the elderly that was enriched for B cell activation/receptor signalling pathway genes. Importantly, we also observed impaired T cell responses to SARS-CoV-2 spike peptides in the elderly post-booster, both in terms of IFNγ and IL-2 secretion and as a decrease in T cell receptor signalling pathway genes. This expansion of atypical B cells and impaired T cell responses may contribute to the generation of less affinity-matured antibodies with lower neutralizing capacity post-third dose in the elderly. Altogether, our data reveal the extent and potential mechanistic underpinning of the impaired vaccine responses present in the elderly after a booster dose, contributing to their increased susceptibility to COVID-19.

6.
Preprint in English | medRxiv | ID: ppmedrxiv-22276437

ABSTRACT

The biology driving individual patient responses to SARS-CoV-2 infection remains poorly understood. Here, we developed a patient-centric framework leveraging detailed longitudinal phenotyping data, covering a year after disease onset, from 215 SARS-CoV-2-infected subjects with differing disease severities. Our analyses revealed distinct "systemic recovery" profiles, with specific progression and resolution of the inflammatory, immune, metabolic and clinical responses over weeks to several months after infection. In particular, we found a strong intra-patient temporal covariation of innate immune cell numbers, kynurenine- and host lipid-metabolites, which suggested candidate immunometabolic pathways putatively influencing restoration of homeostasis, the risk of death and of long COVID. Based on these data, we identified a composite signature predictive of systemic recovery at the patient level, using a joint model on cellular and molecular parameters measured soon after disease onset. New predictions can be generated using the online tool http://shiny.mrc-bsu.cam.ac.uk/apps/covid-systemic-recovery-prediction-app, designed to test our findings prospectively. (A graphical abstract accompanies the preprint.)

7.
Preprint in English | bioRxiv | ID: ppbiorxiv-491004

ABSTRACT

Over 20 mutations have been identified in the N-Terminal Domain (NTD) of SARS-CoV-2 spike, and yet few of them are fully characterised. Here we first examined the contribution of the NTD to infection and cell-cell fusion by constructing different VOC-based chimeric spikes bearing B.1.617 lineage (Delta and Kappa variant) NTDs and generating spike pseudotyped lentivirus (PV). We found that the Delta NTD on a Kappa or WT background increased spike S1/S2 cleavage efficiency and virus entry, specifically in Calu-3 lung cells and airway organoids, through use of TMPRSS2. We have previously shown that Delta spike confers rapid cell-cell fusion kinetics; here we show that increased fusogenicity can be conferred to WT and Kappa variant spikes by transfer of the Delta NTD. Moving to contemporary variants, we found that BA.2 had higher entry efficiency than BA.1 in a range of cell types. BA.2 showed higher fusogenic activity than BA.1, but the BA.2 NTD could not confer higher fusion to the BA.1 spike. TMPRSS2 usage was inefficient for both BA.1 and BA.2, and chimeras of Omicron BA.1 and BA.2 spikes with a Delta NTD did not result in more efficient use of TMPRSS2 or greater cell-cell fusogenicity. We conclude that the NTD allosterically modulates S1/S2 cleavage and spike-mediated functions such as entry and cell-cell fusion in a spike context-dependent manner, and allosteric interactions may be lost when combining regions from more distantly related spike proteins. These data may explain the lack of successful SARS-CoV-2 inter-variant recombinants bearing breakpoints within spike.

8.
Entropy (Basel) ; 24(10)2022 Sep 29.
Article in English | MEDLINE | ID: mdl-37420411

ABSTRACT

An overview is presented of several diverse branches of work in the area of effectively 2D fluid equilibria which have in common that they are constrained by an infinite number of conservation laws. Broad concepts, and the enormous variety of physical phenomena that can be explored, are highlighted. These span, roughly in order of increasing complexity, Euler flow, nonlinear Rossby waves, 3D axisymmetric flow, shallow water dynamics, and 2D magnetohydrodynamics. The classical field theories describing these systems bear some resemblance to perhaps more familiar fluctuating membrane and continuous spin models, but the fluid physics drives these models into unconventional regimes exhibiting large scale jet and eddy structures. From a dynamical point of view these structures are the end result of various conserved variable forward and inverse cascades. The resulting balance between large scale structure and small scale fluctuations is controlled by the competition between energy and entropy in the system free energy, in turn highly tunable through setting the values of the conserved integrals. Although the statistical mechanical description of such systems is fully self-consistent, with remarkable mathematical structure and diversity of solutions, great care must be taken because the underlying assumptions, especially ergodicity, can be violated or at minimum lead to exceedingly long equilibration times. Generalization of the theory to include weak driving and dissipation (e.g., non-equilibrium statistical mechanics and associated linear response formalism) could provide additional insights, but has yet to be properly explored.
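The "competition between energy and entropy," constrained by the conserved integrals, that the abstract refers to is commonly written as a constrained free-energy functional; the schematic form below is a generic illustration of that idea, not an equation taken from the paper.

```latex
% Schematic constrained free energy for an effectively 2D fluid equilibrium:
% entropy S competes with energy E, and Lagrange multipliers \mu_i enforce
% the infinite family of conserved integrals (Casimirs) C_i.
F[\rho] = E[\rho] - T\,S[\rho] - \sum_i \mu_i\, C_i[\rho],
\qquad \delta F[\rho] = 0 \ \text{at equilibrium.}
```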

9.
Preprint in English | bioRxiv | ID: ppbiorxiv-473248

ABSTRACT

The SARS-CoV-2 Omicron BA.1 variant emerged in late 2021 and is characterised by multiple spike mutations across all spike domains. Here we show that Omicron BA.1 has higher affinity for ACE2 compared to Delta, and confers very significant evasion of therapeutic monoclonal and vaccine-elicited polyclonal neutralising antibodies after two doses. mRNA vaccination as a third vaccine dose rescues and broadens neutralisation. Importantly, the antiviral drugs remdesivir and molnupiravir retain efficacy against Omicron BA.1. We found that in human nasal epithelial 3D cultures, replication was similar for both Omicron and Delta. However, in lower airway organoids, Calu-3 lung cells and gut adenocarcinoma cell lines, live Omicron virus demonstrated significantly lower replication in comparison to Delta. We noted that despite the presence of mutations predicted to favour spike S1/S2 cleavage, the spike protein is less efficiently cleaved in live Omicron virions compared to Delta virions. We mapped the replication differences between the variants to entry efficiency using spike pseudotyped virus (PV) entry assays. The defect for Omicron PV in specific cell types correlated with higher cellular RNA expression of TMPRSS2, and accordingly, knockdown of TMPRSS2 impacted Delta entry to a greater extent than Omicron entry. Furthermore, drug inhibitors targeting specific entry pathways demonstrated that the Omicron spike inefficiently utilises the cellular protease TMPRSS2, which mediates cell entry via plasma membrane fusion. Instead, we demonstrate that Omicron spike has greater dependency on cell entry via the endocytic pathway, requiring the activity of endosomal cathepsins to cleave spike. Consistent with suboptimal S1/S2 cleavage and inability to utilise TMPRSS2, syncytium formation by the Omicron spike was dramatically impaired compared to the Delta spike. Overall, Omicron appears to have gained significant evasion from neutralising antibodies whilst maintaining sensitivity to antiviral drugs targeting the polymerase. Omicron has shifted cellular tropism away from TMPRSS2-expressing cells, which are enriched in the lower respiratory and GI tracts, with implications for altered pathogenesis.

10.
Preprint in English | medRxiv | ID: ppmedrxiv-21260360

ABSTRACT

Prominent early features of COVID-19 include severe, often clinically silent, hypoxia and a pronounced reduction in B cells, the latter important in defence against SARS-CoV-2. This brought to mind the phenotype of mice with VHL-deficient B cells, in which Hypoxia-Inducible Factors are constitutively active, suggesting that hypoxia might drive B cell abnormalities in COVID-19. We demonstrated the breadth of early and persistent defects in B cell subsets in moderate/severe COVID-19, including reduced marginal zone-like, memory and transitional B cells, changes we also observed in mice with VHL-deficient B cells. This was corroborated by hypoxia-related transcriptional changes in COVID-19 patients, and by similar B cell abnormalities in mice kept in hypoxic conditions, including reduced marginal zone and germinal center B cells. Thus, hypoxia might contribute to B cell pathology in COVID-19 and in other hypoxic states. Through this mechanism it may impact on COVID-19 outcome, and be remediable through early oxygen therapy.

11.
Gait Posture ; 87: 87-94, 2021 06.
Article in English | MEDLINE | ID: mdl-33895636

ABSTRACT

BACKGROUND: There is a common perception that poorly fitting footwear will negatively impact a child's foot; however, there is limited evidence to support this. AIM: To determine the effect of shoe size on foot motion, perceived footwear comfort and fit during walking, maximal vertical jump height and maximal standing broad jump distance in children aged 8-12 years. METHODS: Fourteen participants completed 3D walking gait analysis and jumping tasks in three different sizes of school shoes (one size bigger, fitted for size, one size smaller). In-shoe motion of the hindfoot, midfoot and 1st metatarsophalangeal joint (1st MTPJ) was calculated using a multi-segment kinematic foot model. Physical performance was measured via maximal vertical jump and maximal standing broad jump. Perceived footwear comfort and fit (heel, toes and overall) were assessed using a 100 mm visual analog scale (VAS). Differences between shoe sizes were compared using repeated measures ANOVA, post-hoc tests and effect sizes (Cohen's d). RESULTS: Compared to the fitted footwear, the smaller size restricted hindfoot eversion (-2.5°, p = 0.021, d = 0.82) and 1st MTPJ dorsiflexion (-3.9°, p = 0.012, d = 0.54); compared to the bigger footwear, the smaller size restricted sagittal plane midfoot range of motion during walking (-2.5°, p = 0.047, d = 0.59). The fitted footwear was rated as more comfortable overall, and the smaller size was rated as too tight in both the heel (mean difference 11.5 mm, p = 0.042, d = 0.58) and the toes (mean difference 12.1 mm, p = 0.022, d = 0.59) compared to the fitted size. Vertical jump height and standing broad jump distance were not affected by footwear size (p = 0.218-0.836). SIGNIFICANCE: Footwear that is too small restricts foot motion during walking in children aged 8-12 years. Jump performance was not affected. Children were able to recognise shoes that were not correctly matched to their foot length, reinforcing that comfort is an important part of the fitting process.
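The effect sizes reported above are Cohen's d values for within-subject contrasts between shoe-size conditions; a minimal sketch of that calculation on made-up range-of-motion values (degrees) is shown below.

```python
# Sketch of a paired post-hoc comparison with a Cohen's d effect size (d_z),
# of the kind reported for two shoe-size conditions. Values are hypothetical.
import numpy as np
from scipy import stats

fitted = np.array([8.1, 9.4, 7.2, 10.0, 8.8, 9.1, 7.9, 8.5, 9.7, 8.2, 9.0, 8.6, 7.5, 9.3])
smaller = fitted - np.random.default_rng(1).normal(2.5, 1.0, size=fitted.size)  # simulated restriction

t, p = stats.ttest_rel(fitted, smaller)     # paired comparison
diff = fitted - smaller
d = diff.mean() / diff.std(ddof=1)          # Cohen's d for paired data (d_z)
print(f"mean difference = {diff.mean():.1f} deg, p = {p:.3f}, d = {d:.2f}")
```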


Subject(s)
Foot Orthoses; Foot; Schools; Shoes; Biomechanical Phenomena; Child; Humans; Walking
12.
Biomolecules ; 11(4)2021 03 26.
Article in English | MEDLINE | ID: mdl-33810574

ABSTRACT

Physical sedentarism is linked to elevated levels of circulating cytokines, whereas exercise upregulates growth-promoting proteins such as brain-derived neurotrophic factor (BDNF). The shift towards a 'repair' phenotype could protect against neurodegeneration, especially in diseases such as multiple sclerosis (MS). We investigated whether having higher fitness or participating in an acute bout of maximal exercise would shift the balance of BDNF and interleukin-6 (IL-6) in serum samples of people with progressive MS (n = 14), compared to matched controls (n = 8). Participants performed a maximal graded exercise test on a recumbent stepper, and blood samples were collected at rest and after the test. We assessed walking speed, fatigue, and maximal oxygen consumption (V·O2max). People with MS achieved about 50% lower V·O2max (p = 0.003) than controls. At rest, there were no differences in BDNF between MS and controls; however, IL-6 was significantly higher in MS. Higher V·O2max was associated with a shift in BDNF/IL-6 ratio from inflammation to repair (R = 0.7, p = 0.001) when considering both groups together. In the MS group, greater ability to upregulate BDNF was associated with faster walking speed and lower vitality. We present evidence that higher fitness indicates a shift in the balance of blood biomarkers towards a repair phenotype in progressive MS.


Subject(s)
Brain-Derived Neurotrophic Factor/blood; Exercise; Interleukin-6/blood; Multiple Sclerosis/pathology; Adult; Biomarkers/blood; Case-Control Studies; Female; Humans; Male; Middle Aged; Multiple Sclerosis/metabolism; Oxygen Consumption
13.
Preprint in English | medRxiv | ID: ppmedrxiv-21251054

ABSTRACT

Two-dose mRNA vaccination provides excellent protection against SARS-CoV-2. However, there are few data on vaccine efficacy in elderly individuals above the age of 80. Additionally, new variants of concern (VOC) with reduced sensitivity to neutralising antibodies have raised fears for vulnerable groups. Here we assessed humoral and cellular immune responses following vaccination with the mRNA vaccine BNT162b2 in elderly participants prospectively recruited from the community and in younger health care workers. Amongst the 140 participants, the median age was 72 years and 51% were female. Neutralising antibody responses after the first vaccine dose diminished with increasing age, with a marked drop in participants over 80 years old. Sera from participants both below and above 80 showed significantly lower neutralisation potency against the B.1.1.7, B.1.351 and P.1 variants of concern compared to wild type. Those over 80 were more likely to lack any neutralisation against VOC compared to younger participants following the first dose. The adjusted odds ratios for inadequate neutralisation activity against the B.1.1.7, P.1 and B.1.351 variants in the older versus younger age group were 4.3 (95% CI 2.0-9.3, p<0.001), 6.7 (95% CI 1.7-26.3, p=0.008) and 1.7 (95% CI 0.5-5.7, p=0.41), respectively. Binding IgG and IgA antibodies were lower in the elderly, as was the frequency of SARS-CoV-2 spike-specific memory B cells. We observed a trend towards lower somatic hypermutation in participants with suboptimal neutralisation, and elderly participants demonstrated a clear reduction in class-switched somatic hypermutation, driven by the IgA1/2 isotype. SARS-CoV-2 spike-specific T-cell IFNγ and IL-2 responses fell with increasing age, and both cytokines were secreted primarily by CD4 T cells. We conclude that the elderly are a high-risk population that warrants specific measures to mitigate against vaccine failure, particularly where variants of concern are circulating.

14.
Genes (Basel) ; 12(1)2021 01 18.
Article in English | MEDLINE | ID: mdl-33477542

ABSTRACT

Understanding the genomic and environmental basis of cold adaptation is key to understanding how plants survive and adapt to different environmental conditions across their natural range. Univariate and multivariate genome-wide association (GWAS) and genotype-environment association (GEA) analyses were used to test associations among genome-wide SNPs obtained from whole-genome resequencing, measures of growth, phenology, emergence, cold hardiness, and range-wide environmental variation in coastal Douglas-fir (Pseudotsuga menziesii). Results suggest a complex genomic architecture of cold adaptation, in which traits are either highly polygenic or controlled by both large- and small-effect genes. Newly discovered associations for cold adaptation in Douglas-fir included 130 genes involved in many important biological functions such as primary and secondary metabolism, growth and reproductive development, transcription regulation, stress and signaling, and DNA processes. These genes were related to growth, phenology and cold hardiness and depended strongly on variation in environmental variables such as degree days below 0 °C, precipitation, elevation and distance from the coast. This study is a step forward in our understanding of the complex interconnection between environment and genomics and their role in cold-associated trait variation in boreal tree species, providing a baseline for predictions for the species under climate change.
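As an illustration of the univariate association testing described (one model per SNP against a trait), here is a minimal sketch on simulated genotypes; a real GWAS/GEA of this kind would additionally correct for population structure and relatedness.

```python
# Minimal sketch of a univariate genotype-trait association scan: one linear
# regression per SNP, with a Bonferroni threshold. All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trees, n_snps = 200, 1000
genotypes = rng.integers(0, 3, size=(n_trees, n_snps))              # 0/1/2 allele counts
cold_hardiness = rng.normal(size=n_trees) + 0.4 * genotypes[:, 10]  # SNP 10 carries a true effect

pvals = np.array([
    stats.linregress(genotypes[:, j], cold_hardiness).pvalue
    for j in range(n_snps)
])
bonferroni_hits = np.where(pvals < 0.05 / n_snps)[0]
print("SNPs passing the Bonferroni threshold:", bonferroni_hits)
```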


Subject(s)
Acclimatization/genetics; Genes, Plant; Polymorphism, Single Nucleotide; Pseudotsuga/genetics; Genome-Wide Association Study
15.
Preprint in English | medRxiv | ID: ppmedrxiv-21249840

ABSTRACT

Severe Acute Respiratory Syndrome Coronavirus-2 (SARS-CoV-2) transmission is uncontrolled in many parts of the world, compounded in some areas by the higher transmission potential of the B.1.1.7 variant, now seen in 50 countries. It is unclear whether responses to SARS-CoV-2 vaccines based on the prototypic strain will be impacted by mutations found in B.1.1.7. Here we assessed immune responses following vaccination with the mRNA-based vaccine BNT162b2. We measured neutralising antibody responses following a single immunization using pseudoviruses expressing the wild-type Spike protein or the 8 amino acid mutations found in the B.1.1.7 spike protein. The vaccine sera exhibited a broad range of neutralising titres against the wild-type pseudoviruses that were modestly reduced against the B.1.1.7 variant. This reduction was also evident in sera from some convalescent patients. Decreased B.1.1.7 neutralisation was also observed with monoclonal antibodies targeting the N-terminal domain (9 out of 10) and the Receptor Binding Motif (RBM) (5 out of 31), but not with neutralising mAbs binding outside the RBM. Introduction of the E484K mutation in a B.1.1.7 background, reflecting newly emerging viruses in the UK, led to a more substantial loss of neutralising activity by vaccine-elicited antibodies and mAbs (19 out of 31) than that conferred by the B.1.1.7 mutations alone. E484K emergence on a B.1.1.7 background represents a threat to the BNT162b2 vaccine.

16.
Preprint in English | medRxiv | ID: ppmedrxiv-20220699

ABSTRACT

Background: The COVID-19 pandemic continues to grow at an unprecedented rate. Healthcare workers (HCWs) are at higher risk of SARS-CoV-2 infection than the general population, but risk factors for HCW infection are not well described. Methods: We conducted a prospective sero-epidemiological study of HCWs at a UK teaching hospital using a SARS-CoV-2 immunoassay. Risk factors for seropositivity were analysed using multivariate logistic regression. Findings: 410/5,698 (7·2%) staff tested positive for SARS-CoV-2 antibodies. Seroprevalence was higher in those working in designated COVID-19 areas compared with other areas (9·47% versus 6·16%). Healthcare assistants (aOR 2·06 [95% CI 1·14-3·71]; p=0·016) and domestic and portering staff (aOR 3·45 [95% CI 1·07-11·42]; p=0·039) had significantly higher seroprevalence than other staff groups after adjusting for age, sex, ethnicity and COVID-19 working location. Staff working in acute medicine and medical sub-specialities were also at higher risk (aOR 2·07 [95% CI 1·31-3·25]; p<0·002). Staff from Black, Asian and minority ethnic (BAME) backgrounds had an aOR of 1·65 (95% CI 1·32-2·07; p<0·001) compared to white staff; this increased risk was independent of COVID-19 area working. The only symptoms significantly associated with seropositivity in a multivariable model were loss of sense of taste or smell, fever and myalgia; 31% of staff testing positive reported no prior symptoms. Interpretation: Risk of SARS-CoV-2 infection amongst HCWs is heterogeneous and influenced by COVID-19 working location, role, age and ethnicity. Increased risk amongst BAME staff cannot be accounted for solely by occupational factors. Funding: Wellcome Trust, Addenbrookes Charitable Trust, National Institute for Health Research, Academy of Medical Sciences, the Health Foundation and the NIHR Cambridge Biomedical Research Centre. Research in context. Evidence before this study: Specific risk factors for SARS-CoV-2 infection in healthcare workers (HCWs) are not well defined. Additionally, it is not clear how population-level risk factors influence occupational risk in defined demographic groups. Only by identifying these factors can we mitigate and reduce the risk of occupational SARS-CoV-2 infection. We performed a review of the evidence for HCW-specific risk factors for SARS-CoV-2 infection. We searched PubMed with the terms "SARS-CoV-2" OR "COVID-19" AND "Healthcare worker" OR "Healthcare Personnel" AND "Risk factor" to identify any studies published in any language between December 2019 and September 2020. The search identified 266 studies and included a meta-analysis and two observational studies assessing HCW cohort seroprevalence data. Seroprevalence and risk factors for HCW infections varied between studies, with contradictory findings. Of the two serological studies, one identified a significantly increased risk of seropositivity in those working with COVID-19 patients (Eyre et al 2020), as well as associations with job role and department. The other study (Dimcheff et al 2020) found no significant association between seropositivity and any identified demographic or occupational factor.
A meta-analysis of HCWs (Gomez-Ochoa et al 2020) assessed >230,000 participants in a pooled analysis, including diagnoses by both RT-PCR and seropositivity for SARS-CoV-2 antibodies, and found great heterogeneity in study design and contradictory findings. Of note, they report a seropositivity rate of 7% across all studies reporting SARS-CoV-2 antibodies in HCWs. Nurses were the most frequently affected healthcare personnel, and staff working in non-emergency inpatient settings were the most frequently affected group. Our search found no prospective studies systematically evaluating HCW-specific risk factors based entirely on seroprevalence data. Added value of this study: Our prospective cohort study of almost 6,000 HCWs at a large UK teaching hospital strengthens previous findings from UK-based cohorts in identifying an increased risk of SARS-CoV-2 exposure amongst HCWs. Specifically, factors associated with SARS-CoV-2 exposure include caring for confirmed COVID-19 cases and identifying as being within specific ethnic groups (BAME staff). We further delineated the risk amongst BAME staff and demonstrate that occupational factors alone do not account for all of the increased risk amongst this group. We demonstrate for the first time that healthcare assistants represent a key at-risk occupational group, and challenge previous findings of significantly higher risk amongst nursing staff. Seroprevalence in staff not working in areas with confirmed COVID-19 patients was only marginally higher than that of the general population within the same geographical region. This observation could suggest that the increased risk amongst HCWs arises through occupational exposure to confirmed cases and could account for the overall higher seroprevalence in HCWs, rather than purely the presence of staff in healthcare facilities. Over 30% of seropositive staff had not reported symptoms consistent with COVID-19, and in those who did report symptoms, differentiating COVID-19 from other causes based on symptom data alone was unreliable. Implications of all the available evidence: International efforts to reduce the risk of SARS-CoV-2 infection amongst HCWs need to be prioritised. The risk of SARS-CoV-2 infection amongst HCWs is heterogeneous but follows demonstrable patterns. Potential mechanisms to reduce the risk for staff working in areas with confirmed COVID-19 patients include improved training in hand hygiene and personal protective equipment (PPE), better access to high quality PPE, and frequent asymptomatic testing. Wider asymptomatic testing in healthcare facilities has the potential to reduce spread of SARS-CoV-2 within hospitals, thereby reducing patient and staff risk and limiting spread between hospitals and into the wider community. The increased risk of COVID-19 amongst BAME staff cannot be explained by purely occupational factors; however, the increased risk amongst minority ethnic groups identified here was stark and necessitates further evaluation.
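The adjusted odds ratios quoted above come from a multivariable logistic model of seropositivity; a minimal sketch of that kind of model, on simulated data with hypothetical variable names, is shown below.

```python
# Hedged sketch of a multivariable logistic model yielding adjusted odds
# ratios (aOR). Variables and data are simulated placeholders, so the
# estimates here sit near the null by construction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "seropositive": rng.integers(0, 2, n),
    "age": rng.normal(40, 12, n),
    "sex": rng.choice(["F", "M"], n),
    "covid_area": rng.integers(0, 2, n),     # works in a designated COVID-19 area
    "role": rng.choice(["nurse", "doctor", "hca", "domestic"], n),
})

model = smf.logit("seropositive ~ age + C(sex) + covid_area + C(role)", data=df).fit(disp=0)
aor = np.exp(model.params)                   # adjusted odds ratios
ci = np.exp(model.conf_int())                # 95% confidence intervals
print(pd.concat([aor.rename("aOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```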

17.
JAMA Netw Open ; 3(7): e209393, 2020 07 01.
Article in English | MEDLINE | ID: mdl-32663307

ABSTRACT

Importance: Trauma is the leading cause of death for US individuals younger than 45 years, and uncontrolled hemorrhage is a major cause of trauma mortality. The US military's medical advancements in the field of prehospital hemorrhage control have reduced battlefield mortality by 44%. However, despite support from many national health care organizations, no integrated approach to research has been made regarding implementation, epidemiology, education, and logistics of prehospital hemorrhage control by layperson immediate responders in the civilian sector. Objective: To create a national research agenda to help guide future work for prehospital hemorrhage control by laypersons. Evidence Review: The 2-day, in-person, National Stop the Bleed (STB) Research Consensus Conference was conducted on February 27 to 28, 2019, to identify and achieve consensus on research gaps. Participants included (1) subject matter experts, (2) professional society-designated leaders, (3) representatives from the federal government, and (4) representatives from private foundations. Before the conference, participants were provided a scoping review on layperson prehospital hemorrhage control. A 3-round modified Delphi consensus process was conducted to determine high-priority research questions. The top items, with median rating of 8 or more on a Likert scale of 1 to 9 points, were identified and became part of the national STB research agenda. Findings: Forty-five participants attended the conference. In round 1, participants submitted 487 research questions. After deduplication and sorting, 162 questions remained across 5 a priori-defined themes. Two subsequent rounds of rating generated consensus on 113 high-priority, 27 uncertain-priority, and 22 low-priority questions. The final prioritized research agenda included the top 24 questions, including 8 for epidemiology and effectiveness, 4 for materials, 9 for education, 2 for global health, and 1 for health policy. Conclusions and Relevance: The National STB Research Consensus Conference identified and prioritized a national research agenda to support laypersons in reducing preventable deaths due to life-threatening hemorrhage. Investigators and funding agencies can use this agenda to guide their future work and funding priorities.
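The prioritisation rule described (a question joins the agenda when its median panel rating reaches 8 or more on the 1-9 scale) reduces to a simple median computation; the sketch below uses invented questions and ratings, and the middle-band cut-off is illustrative rather than taken from the conference methods.

```python
# Sketch of a median-based Delphi prioritisation rule: items with a median
# rating of 8+ on the 1-9 scale are treated as high priority. Ratings are made up.
import numpy as np

ratings = {  # question -> hypothetical panel ratings
    "Q1 epidemiology of layperson tourniquet use": [9, 8, 8, 7, 9, 8],
    "Q2 optimal training interval": [6, 7, 8, 5, 7, 6],
    "Q3 cost-effectiveness of public bleeding-control kits": [8, 9, 7, 8, 8, 9],
}

def priority(scores):
    m = np.median(scores)
    if m >= 8:
        return "high priority"
    elif m >= 4:            # illustrative cut-off for the middle band
        return "uncertain priority"
    return "low priority"

for question, scores in ratings.items():
    print(question, "->", priority(scores))
```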


Subject(s)
Emergency Medical Services; Hemorrhage; Research Design; Wounds and Injuries; Biomedical Research/methods; Consensus; Delphi Technique; Emergency Medical Services/methods; Emergency Medical Services/organization & administration; Hemorrhage/etiology; Hemorrhage/mortality; Hemorrhage/therapy; Humans; Surveys and Questionnaires; Wounds and Injuries/complications; Wounds and Injuries/mortality
18.
Preprint in English | medRxiv | ID: ppmedrxiv-20133157

ABSTRACT

Background: Rapid COVID-19 diagnosis in hospital is essential for patient management and identification of infectious patients to limit the potential for nosocomial transmission. Diagnosis is complicated by 30-50% of COVID-19 hospital admissions having nose/throat swabs that test negative for SARS-CoV-2 nucleic acid, frequently after the first week of illness when SARS-CoV-2 antibody responses become detectable. We assessed the diagnostic accuracy of combined rapid antibody point of care (POC) and nucleic acid assays for suspected COVID-19 disease in the emergency department. Methods: We developed (i) an in vitro neutralization assay using a lentivirus expressing a genome encoding luciferase and pseudotyped with spike (S) protein and (ii) an ELISA test to detect IgG antibodies to nucleocapsid (N) and S proteins from SARS-CoV-2. We tested two lateral flow rapid fingerprick tests with bands for IgG and IgM. We then prospectively recruited participants with suspected moderate to severe COVID-19 and tested for SARS-CoV-2 nucleic acid in a combined nasal/throat swab using the standard laboratory RT-PCR and a validated rapid POC nucleic acid amplification (NAAT) test. Additionally, serum collected at admission was retrospectively tested by in vitro neutralisation, ELISA and the candidate POC antibody tests. We evaluated the performance of the individual and combined rapid POC diagnostic tests against a composite reference standard of neutralisation and standard laboratory-based RT-PCR. Results: 45 participants had specimens tested for nucleic acid in nose/throat swabs as well as stored sera for antibodies. Using the composite reference standard, prevalence of COVID-19 disease was 53.3% (24/45). Median age was 73.5 (IQR 54.0-86.5) years in those with COVID-19 disease by our reference standard and 63.0 (IQR 41.0-72.0) years in those without disease. The overall detection rate by rapid NAAT was 79.2% (95% CI 57.8-92.9%), decreasing from 100% (95% CI 65.3-98.6%) in days 1-4 to 50.0% (95% CI 11.8-88.2%) for days 9-28 post symptom onset. Correct identification of COVID-19 with combined rapid POC diagnostic tests was 100% (95% CI 85.8-100%), with a false positive rate of 5.3-14.3%, driven by the POC LFA antibody tests. Conclusions: Combined POC tests have the potential to transform our management of COVID-19, including inflammatory manifestations later in disease where nucleic acid test results are negative. A rapid combined approach will also aid recruitment into clinical trials and in prescribing therapeutics, particularly where potentially harmful immune modulators (including steroids) are used.

19.
Preprint in English | medRxiv | ID: ppmedrxiv-20114520

ABSTRACT

Background: There is an urgent need for safe and efficient triage protocols for hospitalized COVID-19 suspects to appropriate isolation wards. A major barrier to timely discharge of patients from the emergency room and hospital is the turnaround time for many SARS-CoV-2 nucleic acid tests. We validated a point of care nucleic acid amplification-based platform, SAMBA II, for diagnosis of COVID-19 and performed an implementation study to assess its impact on patient disposition at a major academic hospital. Methods: We prospectively recruited COVID-19 suspects admitted to hospital (NCT04326387). In an initial pilot phase, individuals were tested using a nasal/throat swab with the SAMBA II SARS-CoV-2 rapid diagnostic platform in parallel with a combined nasal/throat swab for standard central laboratory RT-PCR testing. In the second, implementation phase, we examined the utility of adding the SAMBA platform to routine care. In the pilot phase, we measured concordance and assay validity using the central laboratory as the reference standard and assessed assay turnaround time. In the implementation phase, we assessed 1) time to definitive bed placement from admission, 2) time spent on COVID-19 holding wards, and 3) the proportion of patients in isolation versus COVID-negative areas following a test, comparing the implementation phase with the 10 days prior to implementation. Results: In phase I, 149 participants were included in the pilot. By central laboratory RT-PCR testing, 32 (21.5%) tested positive and 117 (78.5%) tested negative. Sensitivity and specificity of the SAMBA assay compared to the RT-PCR lab test were 96.9% (95% CI 0.838-0.999) and 99.1% (0.953-0.999), respectively. Median time to result was 2.6 hours (IQR 2.3 to 4.8) for the SAMBA II SARS-CoV-2 test and 26.4 hours (IQR 21.4 to 31.4) for the standard lab RT-PCR test (p<0.001). In the first 10 days of the SAMBA implementation phase, we conducted 992 tests, with the majority (59.8%) used for hospital admission and the remainder for pre-operative screening (11.3%), discharge planning (10%), and in-hospital screening of new symptoms (9.7%). Comparing the pre-implementation phase (n=599) with the implementation phase, median time to definitive bed placement from admission was reduced from 23.4 hours (8.6-41.9) to 17.1 hours (9.0-28.8), P=0.02 in Cox analysis adjusted for age, sex, comorbidities and clinical severity at presentation. Mean length of stay on a COVID-19 holding ward decreased from 58.5 hours to 29.9 hours (P<0.001). Use of single occupancy rooms amongst those tested fell from 30.8% before to 21.2% after implementation (P=0.03), and 11 hospital bay closures (on average 6 beds each) were avoided after implementation of the POC assay. Conclusions: The SAMBA II SARS-CoV-2 rapid assay performed well compared to a centralized laboratory RT-PCR platform and demonstrated a shorter time to result in both trial and real-world settings. It was also associated with faster time to definitive bed placement from the emergency room, greater availability of isolation rooms, avoidance of hospital bay closures, and greater movement of patients to COVID-negative open "green" category wards. Rapid testing in hospitals has the potential to transform the ability to deal with the COVID-19 epidemic.
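The sensitivity and specificity figures above follow directly from a 2x2 comparison against the laboratory reference; the sketch below reproduces that arithmetic with counts back-calculated only approximately from the reported percentages (31/32 and 116/117), so treat them as illustrative.

```python
# Sketch of diagnostic-accuracy arithmetic for a point-of-care assay versus a
# laboratory reference. The 2x2 counts are approximate reconstructions from
# the reported percentages, not the study's tabulated data.
from statsmodels.stats.proportion import proportion_confint

tp, fn = 31, 1      # assay-positive / assay-negative among 32 reference positives
tn, fp = 116, 1     # assay-negative / assay-positive among 117 reference negatives

sens = tp / (tp + fn)
spec = tn / (tn + fp)
sens_ci = proportion_confint(tp, tp + fn, method="beta")   # Clopper-Pearson exact 95% CI
spec_ci = proportion_confint(tn, tn + fp, method="beta")
print(f"sensitivity = {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
print(f"specificity = {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
```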

20.
Ecol Evol ; 9(11): 6259-6275, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31236219

ABSTRACT

Variation in natural selection across heterogeneous landscapes often produces (a) among-population differences in phenotypic traits, (b) trait-by-environment associations, and (c) higher fitness of local populations. Using a broad literature review of common garden studies published between 1941 and 2017, we documented the commonness of these three signatures in plants native to North America's Great Basin, an area of extensive restoration and revegetation efforts, and asked which traits and environmental variables were involved. We also asked, independent of geographic distance, whether populations from more similar environments had more similar traits. From 327 experiments testing 121 taxa in 170 studies, we found 95.1% of 305 experiments reported among-population differences, and 81.4% of 161 experiments reported trait-by-environment associations. Locals showed greater survival in 67% of 24 reciprocal experiments that reported survival, and higher fitness in 90% of 10 reciprocal experiments that reported reproductive output. A meta-analysis on a subset of studies found that variation in eight commonly measured traits was associated with mean annual precipitation and mean annual temperature at the source location, with notably strong relationships for flowering phenology, leaf size, and survival, among others. Although the Great Basin is sometimes perceived as a region of homogeneous ecosystems, our results demonstrate widespread habitat-related population differentiation and local adaptation. Locally sourced plants likely harbor adaptations at rates and magnitudes that are immediately relevant to restoration success, and our results suggest that certain key traits and environmental variables should be prioritized in future assessments of plants in this region.
